Blockwise acceleration of alternating least squares for canonical tensor decomposition

Abstract

The canonical polyadic (CP) decomposition of tensors is one of the most important tensor decompositions. The well-known alternating least squares (ALS) algorithm, often considered the workhorse for computing the CP decomposition, is known to suffer from slow convergence in many cases, and various algorithms have been proposed to accelerate it. In this article, we propose a new accelerated ALS algorithm that accelerates ALS in a blockwise manner using a simple momentum-based extrapolation technique and a random perturbation technique. Specifically, our algorithm updates one factor matrix (i.e., block) at a time, as in ALS, with each update consisting of a minimization step that directly reduces the reconstruction error, an extrapolation step that moves along the previous update direction, and a random perturbation step for breaking convergence bottlenecks. Our extrapolation strategy takes a simpler form than state-of-the-art extrapolation strategies and is easier to implement, and it has negligible computational overhead, making it cheap to apply. Empirically, our algorithm shows strong performance compared with existing acceleration techniques on both simulated and real tensors.
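To make the three-part update concrete, here is a minimal NumPy sketch of one plausible reading of such a blockwise-accelerated CP-ALS for a 3-way tensor. It is an illustration under stated assumptions, not the paper's implementation: the momentum weight `beta` and perturbation scale `noise` are hypothetical hyperparameters, and the extrapolation form (`new + beta * (new - prev)`) is one simple momentum variant consistent with the abstract's description.

```python
import numpy as np

rng = np.random.default_rng(0)

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product; row (i * V.shape[0] + j) holds U[i] * V[j].
    r = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, r)

def unfold(X, mode):
    # Mode-n unfolding of a 3-way array, flattening the remaining modes in
    # C order (consistent with the row ordering of khatri_rao above).
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def accelerated_als(X, rank, n_iter=300, beta=0.3, noise=1e-9):
    # Illustrative blockwise-accelerated CP-ALS (assumed form, not the
    # paper's exact algorithm).  Each block update consists of:
    #   1. a least-squares minimization step that reduces reconstruction error,
    #   2. a momentum-style extrapolation along the previous update direction,
    #   3. a tiny random perturbation.
    factors = [rng.standard_normal((d, rank)) for d in X.shape]
    prev = [f.copy() for f in factors]
    for _ in range(n_iter):
        for n in range(3):
            U, V = [factors[m] for m in range(3) if m != n]
            G = (U.T @ U) * (V.T @ V)  # Gram matrix via Hadamard product
            # 1. exact least-squares update for block n
            new = unfold(X, n) @ khatri_rao(U, V) @ np.linalg.pinv(G)
            # 2. extrapolate along the previous update direction
            step = new - prev[n]
            prev[n] = new
            # 3. small random perturbation (helps escape bottlenecks)
            factors[n] = new + beta * step + noise * rng.standard_normal(new.shape)
    return factors
```

Note that the extrapolated (and perturbed) factors, not the pure least-squares iterates, are used in the subsequent block solves, which is what makes the acceleration blockwise rather than a post-hoc correction.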


Similar articles

Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation

A local convergence theorem for calculating canonical low-rank tensor approximations (PARAFAC, CANDECOMP) by the alternating least squares algorithm is established. The main assumption is that the Hessian matrix of the problem is positive definite modulo the scaling indeterminacy. A discussion, whether this is realistic, and numerical illustrations are included. Also regularization is addressed.


Randomized Alternating Least Squares for Canonical Tensor Decompositions: Application to A PDE With Random Data

This paper introduces a randomized variation of the alternating least squares (ALS) algorithm for rank reduction of canonical tensor formats. The aim is to address the potential numerical ill-conditioning of least squares matrices at each ALS iteration. The proposed algorithm, dubbed randomized ALS, mitigates large condition numbers via projections onto random tensors, a technique inspired by w...


Some Convergence Results on the Regularized Alternating Least-Squares Method for Tensor Decomposition

We study the convergence of the Regularized Alternating Least-Squares algorithm for tensor decompositions. As a main result, we have shown that given the existence of critical points of the Alternating Least-Squares method, the limit points of the converging subsequences of the RALS are the critical points of the least squares cost functional. Some numerical examples indicate a faster convergen...


DMS: Distributed Sparse Tensor Factorization with Alternating Least Squares

Tensors are data structures indexed along three or more dimensions. Tensors have found increasing use in domains such as data mining and recommender systems, where dimensions can have enormous length and are consequently very sparse. The canonical polyadic decomposition (CPD) is a popular tensor factorization for discovering latent features and is most commonly found via the method of alternating...


Tensor Decompositions, Alternating Least Squares and other Tales

This work was originally motivated by a classification of tensors proposed by Richard Harshman. In particular, we focus on simple and multiple “bottlenecks”, and on “swamps”. Existing theoretical results are surveyed, some numerical algorithms are described in detail, and their numerical complexity is calculated. In particular, the interest in using the ELS enhancement in these algorithms is d...



Journal

Journal title: Numerical Linear Algebra With Applications

سال: 2023

ISSN: 1070-5325, 1099-1506

DOI: https://doi.org/10.1002/nla.2516